Optimization Technology Center, Argonne National Laboratory and Northwestern University
NUMERICAL EXPERIENCE WITH A REDUCED HESSIAN METHOD FOR LARGE SCALE CONSTRAINED OPTIMIZATION
Authors
Abstract
The reduced Hessian SQP algorithm introduced in earlier work is developed in this paper into a practical method for large scale optimization. The novelty of the algorithm lies in the incorporation of a correction vector that approximates the cross term Z^T W Y p_Y. This improves the stability and robustness of the algorithm without increasing its computational cost. The paper studies how to implement the algorithm efficiently and presents a set of tests illustrating its numerical performance. An analytic example showing the benefits of the correction term is also presented.
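To make the abstract concrete, the following is a minimal sketch of one null-space SQP step with the cross-term correction, for min f(x) subject to c(x) = 0. The function name `reduced_sqp_step` and its interface are hypothetical (not from the paper), and the correction w = Z^T W Y p_Y is computed with an exact Hessian W purely for illustration; the paper's point is that w can be approximated cheaply.

```python
import numpy as np

def reduced_sqp_step(x, grad_f, c, jac_c, B, W=None):
    """One null-space SQP step for min f(x) s.t. c(x) = 0 (hypothetical sketch).

    Y and Z are orthonormal bases for range(A^T) and null(A), where A is the
    constraint Jacobian; B approximates the reduced Hessian Z^T W Z; the
    correction vector w approximates the cross term Z^T W Y p_Y.
    """
    g, A = grad_f(x), jac_c(x)                  # gradient, m x n Jacobian
    Q, _ = np.linalg.qr(A.T, mode="complete")   # full orthogonal factor
    m = A.shape[0]
    Y, Z = Q[:, :m], Q[:, m:]                   # range- and null-space bases
    p_Y = np.linalg.solve(A @ Y, -c(x))         # range step: (A Y) p_Y = -c
    # Cross-term correction (exact W here; the paper approximates this term)
    w = (Z.T @ W(x) @ Y @ p_Y) if W is not None else np.zeros(Z.shape[1])
    p_Z = np.linalg.solve(B, -(Z.T @ g + w))    # null step: B p_Z = -(Z^T g + w)
    return x + Y @ p_Y + Z @ p_Z

# Toy quadratic: min 0.5*||x||^2 s.t. x1 + x2 + x3 = 1; solution (1/3, 1/3, 1/3).
x_next = reduced_sqp_step(
    np.array([1.0, 0.0, 0.0]),
    grad_f=lambda x: x,
    c=lambda x: np.array([x.sum() - 1.0]),
    jac_c=lambda x: np.ones((1, 3)),
    B=np.eye(2),                 # exact reduced Hessian Z^T I Z = I here
    W=lambda x: np.eye(3),
)
```

With exact B and exact cross term, a single step solves this quadratic problem exactly, which is why the correction term matters when p_Y is large relative to p_Z.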
Related articles
Numerical Experience with a Reduced Hessian Method for Large Scale Constrained Optimization
We propose a quasi-Newton algorithm for solving large optimization problems with nonlinear equality constraints. It is designed for problems with few degrees of freedom, and is motivated by the need to use sparse matrix factorizations. The algorithm incorporates a correction vector that approximates the cross term Z^T W Y p_Y in order to estimate the curvature in both the range and null spaces of th...
Superlinearly convergent exact penalty projected structured Hessian updating schemes for constrained nonlinear least squares: asymptotic analysis
We present a structured algorithm for solving constrained nonlinear least squares problems, and establish its local two-step Q-superlinear convergence. The approach is based on an adaptive structured scheme due to Mahdavi-Amiri and Bartels of the exact penalty method of Coleman and Conn for nonlinearly constrained optimization problems. The structured adaptation also makes use of the ideas of N...
A Stochastic Quasi-Newton Method for Large-Scale Optimization
The question of how to incorporate curvature information in stochastic approximation methods is challenging. The direct application of classical quasi-Newton updating techniques for deterministic optimization leads to noisy curvature estimates that have harmful effects on the robustness of the iteration. In this paper, we propose a stochastic quasi-Newton method that is efficient, robust and sca...
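The key idea the snippet alludes to can be sketched as follows: build the curvature pair (s, y) from a subsampled Hessian-vector product rather than from a difference of noisy stochastic gradients. The least-squares setup and helper names below are hypothetical illustrations, not the paper's code.

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.normal(size=(500, 4))                  # 500 loss terms (least squares)
b = A @ np.array([1.0, -2.0, 0.5, 3.0])

def stochastic_grad(x, batch):                 # noisy minibatch gradient
    return A[batch].T @ (A[batch] @ x - b[batch]) / len(batch)

def subsampled_hess_vec(x, v, sample):         # y = grad^2 F_S(x) @ v on a subsample
    return A[sample].T @ (A[sample] @ v) / len(sample)

# One SGD step, then a curvature pair (s, y): y comes from a Hessian-vector
# product, so it reflects sampled curvature instead of gradient noise.
x_old = np.zeros(4)
x_new = x_old - 0.1 * stochastic_grad(x_old, rng.choice(500, 32))
s = x_new - x_old
y = subsampled_hess_vec(0.5 * (x_old + x_new), s, rng.choice(500, 100))
```

The pair (s, y) would then feed a limited-memory BFGS update; because the sampled Hessian here is positive semidefinite, s^T y > 0 holds and the update stays well defined.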
NORTHWESTERN UNIVERSITY, Department of Electrical Engineering and Computer Science. L-BFGS-B: FORTRAN SUBROUTINES FOR LARGE SCALE BOUND CONSTRAINED OPTIMIZATION
L-BFGS-B is a limited memory algorithm for solving large nonlinear optimization problems subject to simple bounds on the variables. It is intended for problems in which information on the Hessian matrix is difficult to obtain, or for large dense problems. L-BFGS-B can also be used for unconstrained problems, and in this case performs similarly to its predecessor algorithm L-BFGS (Harwell routine VA...). Th...
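These Fortran subroutines are what SciPy's `L-BFGS-B` method is derived from, so a minimal bound-constrained run can be sketched with `scipy.optimize.minimize` (the Rosenbrock test problem and the specific options below are illustrative choices, not from the report):

```python
import numpy as np
from scipy.optimize import minimize

def rosen(x):
    """Rosenbrock test function; gradient is approximated internally."""
    return 100.0 * (x[1] - x[0] ** 2) ** 2 + (1.0 - x[0]) ** 2

res = minimize(
    rosen,
    x0=np.array([-1.2, 1.0]),
    method="L-BFGS-B",
    bounds=[(-2.0, 2.0), (0.0, 2.0)],   # simple bounds on each variable
    options={"maxcor": 10},             # number of stored correction pairs
)
```

The `maxcor` option is the limited-memory parameter: only that many (s, y) correction pairs are kept, which is what makes the method practical when the Hessian is unavailable or dense.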
On the Use of Stochastic Hessian Information in Unconstrained Optimization
This paper describes how to incorporate stochastic curvature information in a Newton-CG method and in a limited memory quasi-Newton method for large scale optimization. The motivation for this work stems from statistical learning and stochastic optimization applications in which the objective function is the sum of a very large number of loss terms, and can be evaluated with a varying degree of ...
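The Newton-CG variant described above can be sketched with SciPy: the gradient uses the full sum of loss terms, while the Hessian-vector products passed to `hessp` use only a subsample. The least-squares data and sample sizes below are hypothetical, chosen only to make the idea runnable.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
A = rng.normal(size=(200, 5))        # objective is a sum of 200 loss terms
b = A @ np.ones(5)                   # exact minimizer is the all-ones vector

def f(x):                            # full-sample objective
    r = A @ x - b
    return 0.5 * (r @ r) / len(b)

def grad(x):                         # full-sample (exact) gradient
    return A.T @ (A @ x - b) / len(b)

sub = rng.choice(len(b), size=40, replace=False)   # Hessian subsample
As = A[sub]

def hessp(x, v):                     # cheap subsampled Hessian-vector product
    return As.T @ (As @ v) / len(sub)

res = minimize(f, np.zeros(5), jac=grad, hessp=hessp, method="Newton-CG")
```

Because the gradient is exact, the subsampled Hessian only steers the inner CG iteration; the outer method still converges to the minimizer of the full objective, which is the practical appeal of stochastic Hessian information.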
Journal title:
Volume, Issue:
Pages: -
Publication date: 1993